Human Rademacher Complexity
Authors
Abstract
We propose to use Rademacher complexity, originally developed in computational learning theory, as a measure of human learning capacity. Rademacher complexity measures a learner's ability to fit random labels, and can be used to bound the learner's true error based on the observed training sample error. We first review the definition of Rademacher complexity and its generalization bound. We then describe a "learning the noise" procedure to experimentally measure human Rademacher complexities. Our empirical studies show that: (i) human Rademacher complexity can be successfully measured; (ii) the complexity depends on the domain and the training sample size in intuitive ways; (iii) human learning respects the generalization bounds; and (iv) the bounds can be useful in predicting the danger of overfitting in human learning. Finally, we discuss potential applications of human Rademacher complexity in cognitive science.
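The "fit random labels" idea admits a direct empirical estimate: draw random $\pm 1$ labels $\sigma$ and measure how well the best hypothesis in the class correlates with them, $\hat{R}_n(H) = \mathbb{E}_\sigma\!\left[\sup_{h \in H} \frac{1}{n}\sum_{i=1}^n \sigma_i\, h(x_i)\right]$. The sketch below is a minimal Monte Carlo illustration of this quantity for a machine hypothesis class, not the paper's human-subject procedure; the 1-D threshold class and the data grid are illustrative assumptions.

```python
import random

def empirical_rademacher(data, hypotheses, n_trials=200, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity
    R_hat(H) = E_sigma[ sup_{h in H} (1/n) * sum_i sigma_i * h(x_i) ]."""
    rng = random.Random(seed)
    n = len(data)
    total = 0.0
    for _ in range(n_trials):
        # Random "noise" labels, each +1 or -1 with equal probability.
        sigma = [rng.choice([-1, 1]) for _ in range(n)]
        # How well can the best hypothesis in the class fit this noise?
        best = max(sum(s * h(x) for s, x in zip(sigma, data)) / n
                   for h in hypotheses)
        total += best
    return total / n_trials

# Hypothetical toy class: 1-D threshold classifiers h_t(x) = +1 if x >= t else -1.
thresholds = [i / 10 for i in range(11)]
hypotheses = [(lambda t: (lambda x: 1 if x >= t else -1))(t) for t in thresholds]
data = [i / 20 for i in range(20)]

print(round(empirical_rademacher(data, hypotheses), 3))
```

A richer class (more thresholds, or arbitrary labelings) drives the estimate toward 1, while a larger sample drives it toward 0 — the same two trends the paper reports for human learners.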
Similar resources
Human Algorithmic Stability and Human Rademacher Complexity
In Machine Learning (ML), the learning process of an algorithm given a set of evidence is studied via complexity measures. The way towards using ML complexity measures in the Human Learning (HL) domain has been paved by a previous study, which introduced Human Rademacher Complexity (HRC): in this work, we introduce Human Algorithmic Stability (HAS). Exploratory experiments, performed on a grou...
Can machine learning explain human learning?
Learning Analytics (LA) has a major interest in exploring and understanding the learning process of humans and, for this purpose, benefits from both Cognitive Science, which studies how humans learn, and Machine Learning, which studies how algorithms learn from data. Usually, Machine Learning is exploited as a tool for analyzing data coming from experimental studies, but it has been recently ap...
Lecture 6: Rademacher Complexity
In this lecture, we discuss Rademacher complexity, which is a different (and often better) way to obtain generalization bounds for learning hypothesis classes.
Rademacher Complexity Margin Bounds for Learning with a Large Number of Classes
This paper presents improved Rademacher complexity margin bounds that scale linearly with the number of classes as opposed to the quadratic dependence of existing Rademacher complexity margin-based learning guarantees. We further use this result to prove a novel generalization bound for multi-class classifier ensembles that depends only on the Rademacher complexity of the hypothesis classes to ...
Rademacher Complexity
Rademacher complexity is a measure of the richness of a class of real-valued functions. In this sense, it is similar to the VC dimension. In fact, we will establish a uniform deviation bound in terms of Rademacher complexity, and then use this result to prove the VC inequality. Unlike VC dimension, however, Rademacher complexity is not restricted to binary functions, and will also prove useful ...
Publication date: 2009